Least squares support vector machines (LS-SVM) are least squares versions of support vector machines (SVM), a set of related supervised learning methods that analyze data and recognize patterns, and which are used for classification and regression analysis. In this version one finds the solution by solving a set of linear equations instead of the convex quadratic programming (QP) problem used for classical SVMs. Least squares SVM classifiers were proposed by Suykens and Vandewalle.〔Suykens, J.A.K.; Vandewalle, J. (1999) "Least squares support vector machine classifiers", ''Neural Processing Letters'', 9 (3), 293–300.〕 LS-SVMs are a class of kernel-based learning methods.

==From support vector machine to least squares support vector machine==

Given a training set <math>\{x_i, y_i\}_{i=1}^N</math> with input data <math>x_i \in \mathbb{R}^n</math> and corresponding binary class labels <math>y_i \in \{-1, +1\}</math>, the SVM〔Vapnik, V. The nature of statistical learning theory. Springer-Verlag, New York, 1995.〕 classifier, according to Vapnik's original formulation, satisfies the following conditions:

: <math>\begin{cases} w^T \phi(x_i) + b \ge 1, & \text{if } y_i = +1, \\ w^T \phi(x_i) + b \le -1, & \text{if } y_i = -1, \end{cases}</math>

which is equivalent to

: <math>y_i \left[ w^T \phi(x_i) + b \right] \ge 1, \quad i = 1, \ldots, N,</math>

where <math>\phi(x)</math> is the nonlinear map from the original space to a high- (and possibly infinite-) dimensional feature space.
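To make the "set of linear equations instead of a QP" point above concrete, the following Python sketch trains an LS-SVM classifier by assembling and solving the dual KKT system with a Gaussian (RBF) kernel. The function names, the choice of kernel, and the hyperparameter values are illustrative assumptions and not part of the original formulation.

<syntaxhighlight lang="python">
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian (RBF) kernel matrix between two sets of points (illustrative choice)."""
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-sq / (2 * sigma**2))

def lssvm_train(X, y, gamma=1.0, sigma=1.0):
    """Train an LS-SVM classifier by solving a single linear (KKT) system.

    X: (N, d) inputs; y: (N,) labels in {-1, +1}; gamma: regularisation constant.
    Returns the dual variables alpha and the bias b.
    """
    N = len(y)
    # Omega_ij = y_i y_j K(x_i, x_j)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    # LS-SVM dual system:  [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]
    A = np.zeros((N + 1, N + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(N) / gamma
    rhs = np.concatenate(([0.0], np.ones(N)))
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]  # alpha, b

def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
    """Classify new points as sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    K = rbf_kernel(X_test, X_train, sigma)
    return np.sign(K @ (alpha * y_train) + b)
</syntaxhighlight>

Because the whole fit reduces to one square linear system of size N+1, standard dense or iterative linear solvers can be used in place of a QP solver; this is the main computational difference from the classical SVM.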